Time-Varying Autoregression with Low-Rank Tensors

Authors

Abstract

Article Data
History: Submitted 14 May 2020; Accepted 16 July 2021; Published online 18 November 2021
Keywords: time series, tensor factorization, autoregression, data-driven model
AMS Subject Headings: 37M10, 62M10, 37N30, 15A69, 47A80
Publication Data: ISSN (online) 1536-0040; Publisher: Society for Industrial and Applied Mathematics; CODEN: SJADAY
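The abstract is not reproduced on this page, but the title describes a vector autoregression whose time-varying coefficient matrices are stacked into a third-order tensor with low-rank structure. The sketch below illustrates one natural reading of that idea: fit ordinary AR(1) matrices in sliding windows, stack them into a (windows x n x n) coefficient tensor, and compress it to rank R across time. The windowing scheme, the rank, and the SVD-based factorization are illustrative assumptions, not the authors' actual formulation.

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic data: an n-dimensional series whose AR(1) matrix drifts slowly.
    n, T = 5, 2000
    def A_true(t):
        return 0.5 * np.eye(n) + 0.3 * np.sin(2 * np.pi * t / T) * np.diag(np.ones(n - 1), 1)

    x = np.zeros((T, n))
    for t in range(1, T):
        x[t] = A_true(t) @ x[t - 1] + 0.1 * rng.standard_normal(n)

    # Step 1: ordinary least-squares AR(1) fits in sliding windows.
    win, hop = 200, 100
    A_hat = []
    for s in range(0, T - win, hop):
        X0, X1 = x[s:s + win - 1], x[s + 1:s + win]            # predictors / targets
        A_hat.append(np.linalg.lstsq(X0, X1, rcond=None)[0].T)  # x[t+1] ~ A x[t]
    A_hat = np.stack(A_hat)                                     # (windows, n, n) tensor

    # Step 2: impose low-rank structure across time via a truncated SVD of the
    # mode-1 unfolding (each window's coefficient matrix flattened into a row).
    W, R = A_hat.shape[0], 2                                    # R = assumed rank
    U, sv, Vt = np.linalg.svd(A_hat.reshape(W, n * n), full_matrices=False)
    A_lr = ((U[:, :R] * sv[:R]) @ Vt[:R]).reshape(W, n, n)      # rank-R coefficient tensor

    print("relative error of the rank-%d model: %.3f"
          % (R, np.linalg.norm(A_lr - A_hat) / np.linalg.norm(A_hat)))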


Similar Articles

Low-Rank Tensors for Scoring Dependency Structures

Accurate scoring of syntactic structures such as head-modifier arcs in dependency parsing typically requires rich, high-dimensional feature representations. A small subset of such features is often selected manually. This is problematic when features lack clear linguistic meaning, as in embeddings, or when the information is blended across features. In this paper, we use tensors to map high-dimens...

Full Text
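The abstract above is truncated, but the core trick it alludes to, scoring feature conjunctions through a low-rank factorization instead of enumerating them, can be sketched in a few lines. The bilinear (two-way) toy version below is an assumption for illustration; the paper's full model may differ in arity and parameterization.

    import numpy as np

    rng = np.random.default_rng(1)

    d, R = 50, 8                        # feature dimension and assumed tensor rank
    U = rng.standard_normal((R, d))     # low-rank factors standing in for an
    V = rng.standard_normal((R, d))     # explicit d x d matrix over feature pairs

    def arc_score(phi_head, phi_mod):
        # sum_r (U phi_head)_r * (V phi_mod)_r, i.e. phi_head^T (U^T V) phi_mod:
        # a rank-R bilinear score over all head/modifier feature conjunctions,
        # computed without ever materializing the d x d parameter matrix.
        return float((U @ phi_head) @ (V @ phi_mod))

    phi_h, phi_m = rng.standard_normal(d), rng.standard_normal(d)
    print(arc_score(phi_h, phi_m))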

Random Projections for Low Multilinear Rank Tensors

We propose two randomized tensor algorithms for reducing multilinear ranks in the Tucker format. These algorithms build on the randomized SVD of Halko, Martinsson, and Tropp [9]. Here we provide randomized versions of the higher-order SVD and higher-order orthogonal iteration. Moreover, we provide sharper probabilistic error bounds for the matrix low-rank approximation...

Full Text
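A minimal sketch of the idea described above: each mode's factor matrix is obtained with a randomized range finder in the style of Halko, Martinsson, and Tropp (Gaussian sketch followed by QR), and the Tucker core is formed by contracting the tensor against those factors. The oversampling parameter and ranks are illustrative choices, not the paper's exact algorithm.

    import numpy as np

    def unfold(T, mode):
        # Mode-n unfolding: move `mode` to the front and flatten the rest.
        return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

    def randomized_range(A, r, oversample=10, rng=None):
        # Approximate orthonormal basis for the range of A: multiply by a
        # Gaussian test matrix, then orthonormalize the sketch with QR.
        rng = rng or np.random.default_rng()
        Y = A @ rng.standard_normal((A.shape[1], r + oversample))
        Q, _ = np.linalg.qr(Y)
        return Q[:, :r]

    def randomized_hosvd(T, ranks, rng=None):
        # Randomized higher-order SVD: one sketched factor matrix per mode,
        # then contract the tensor against the factors to get the Tucker core.
        factors = [randomized_range(unfold(T, m), r, rng=rng)
                   for m, r in enumerate(ranks)]
        core = T
        for m, Q in enumerate(factors):
            core = np.moveaxis(np.tensordot(Q.T, np.moveaxis(core, m, 0), axes=1), 0, m)
        return core, factors

    rng = np.random.default_rng(2)
    # A 30 x 30 x 30 tensor of exact multilinear rank (3, 3, 3).
    T = np.einsum('ir,jr,kr->ijk', *(rng.standard_normal((30, 3)) for _ in range(3)))
    core, factors = randomized_hosvd(T, ranks=(3, 3, 3), rng=rng)

    approx = core
    for m, Q in enumerate(factors):              # reconstruct from core and factors
        approx = np.moveaxis(np.tensordot(Q, np.moveaxis(approx, m, 0), axes=1), 0, m)
    print("relative error:", np.linalg.norm(approx - T) / np.linalg.norm(T))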

Multilinear Low-Rank Tensors on Graphs & Applications

We propose a new framework for the analysis of low-rank tensors which lies at the intersection of spectral graph theory and signal processing. As a first step, we present a new graph-based low-rank decomposition which approximates the classical low-rank SVD for matrices and the multilinear SVD for tensors. Then, building on this novel decomposition, we construct a general class of convex optimization...

Full Text
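One plausible realization of a graph-based low-rank decomposition, sketched under the assumption that the approximation is constrained to the span of the first k Laplacian eigenvectors of row and column similarity graphs; the paper's actual construction and optimization problem may differ.

    import numpy as np

    def knn_laplacian(X, k=5):
        # Combinatorial Laplacian L = D - W of a symmetrized k-NN graph on rows of X.
        D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
        W = np.zeros_like(D)
        nbrs = np.argsort(D, axis=1)[:, 1:k + 1]   # k nearest rows, skipping self
        for i in range(len(X)):
            W[i, nbrs[i]] = 1.0
        W = np.maximum(W, W.T)                     # make the adjacency symmetric
        return np.diag(W.sum(axis=1)) - W

    def graph_lowrank(X, k=10):
        # Keep only the part of X that is smooth on both graphs:
        # X ~ P_row @ X @ P_col, where P projects onto the first k Laplacian
        # eigenvectors (np.linalg.eigh returns eigenvalues in ascending order).
        _, Ur = np.linalg.eigh(knn_laplacian(X))   # graph on rows
        _, Uc = np.linalg.eigh(knn_laplacian(X.T)) # graph on columns
        Pr = Ur[:, :k] @ Ur[:, :k].T
        Pc = Uc[:, :k] @ Uc[:, :k].T
        return Pr @ X @ Pc

    rng = np.random.default_rng(3)
    X = rng.standard_normal((60, 4)) @ rng.standard_normal((4, 40))  # rank-4 data
    X_hat = graph_lowrank(X, k=10)
    print("relative error:", np.linalg.norm(X_hat - X) / np.linalg.norm(X))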

Embedding Lexical Features via Low-Rank Tensors

Modern NLP models rely heavily on engineered features, which often combine word and contextual information into complex lexical features. Such combination results in large numbers of features, which can lead to overfitting. We present a new model that represents complex lexical features, comprised of parts for words, contextual information, and labels, in a tensor that captures conjunction informa...

Full Text
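A minimal sketch of the conjunction idea above, assuming a rank-R CP parameterization: a (word, context, label) triple is scored as a sum over rank-one components, so every cross-part feature conjunction is covered without materializing the full parameter tensor. All names and dimensions are illustrative.

    import numpy as np

    rng = np.random.default_rng(4)

    dw, dc, dl, R = 100, 40, 12, 16    # word/context/label feature dims, assumed rank
    U = rng.standard_normal((R, dw))   # one factor matrix per part of the lexical
    V = rng.standard_normal((R, dc))   # feature; together they define a rank-R
    W = rng.standard_normal((R, dl))   # tensor T = sum_r u_r (x) v_r (x) w_r

    def lexical_score(phi_w, phi_c, phi_l):
        # sum_r (U phi_w)_r (V phi_c)_r (W phi_l)_r: scores every
        # word-context-label conjunction without building the dw*dc*dl tensor.
        return float(np.sum((U @ phi_w) * (V @ phi_c) * (W @ phi_l)))

    phi_w, phi_c, phi_l = (rng.standard_normal(dim) for dim in (dw, dc, dl))
    print(lexical_score(phi_w, phi_c, phi_l))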


Journal

Journal title: SIAM Journal on Applied Dynamical Systems

Year: 2021

ISSN: 1536-0040

DOI: https://doi.org/10.1137/20m1338058